Atomistic Line Graph Neural Network for Improved Materials Property Predictions
Graph neural networks (GNNs) have been shown to provide substantial
performance improvements for representing and modeling atomistic materials
compared with descriptor-based machine-learning models. While most existing GNN
models for atomistic predictions are based on atomic distance information, they
do not explicitly incorporate bond angles, which are critical for
distinguishing many atomic structures. Furthermore, many material properties
are known to be sensitive to slight changes in bond angles. We present an
Atomistic Line Graph Neural Network (ALIGNN), a GNN architecture that performs
message passing on both the interatomic bond graph and its line graph
corresponding to bond angles. We demonstrate that angle information can be
explicitly and efficiently included, leading to improved performance on
multiple atomistic prediction tasks. We use ALIGNN models for predicting 52
solid-state and molecular properties available in the JARVIS-DFT, Materials
Project, and QM9 databases. ALIGNN outperforms some previously reported GNN
models on atomistic prediction tasks by up to 85% in accuracy, with better or
comparable model-training speed.
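The line-graph construction at the heart of ALIGNN can be illustrated with a toy sketch (this is not the ALIGNN implementation, and the atom and bond labels are made up): each bond of the atomistic graph becomes a line-graph node, and two bonds that share an atom are connected, so each line-graph edge corresponds to a bond angle.

```python
from itertools import combinations

# Toy "bond graph": nodes are atoms, edges are interatomic bonds.
# Water-like example: O bonded to H1 and H2.
bonds = [("O", "H1"), ("O", "H2")]

def line_graph_edges(bonds):
    """Each bond becomes a line-graph node; two bonds are joined
    whenever they share an atom, so every line-graph edge stands
    for a bond angle (here, the H1-O-H2 angle)."""
    edges = []
    for b1, b2 in combinations(bonds, 2):
        if set(b1) & set(b2):  # shared atom -> an angle between bonds
            edges.append((b1, b2))
    return edges

angles = line_graph_edges(bonds)
print(angles)  # [(('O', 'H1'), ('O', 'H2'))] -- one angle, at the O atom
```

Message passing on this second graph is what lets angle information flow through the model alongside the distance information carried by the bond graph.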
ChemNLP: A Natural Language Processing based Library for Materials Chemistry Text Data
In this work, we present the ChemNLP library, which can be used for 1) curating
open-access datasets from the materials and chemistry literature; 2) developing
and comparing traditional machine learning, transformer, and graph neural
network models for classifying and clustering texts; 3) named entity
recognition for large-scale text mining; 4) abstractive summarization for
generating article titles from abstracts; 5) text generation for suggesting
abstracts from titles; 6) integration with density functional theory datasets
for identifying potential candidate materials such as superconductors; and 7)
web-interface development for text and reference queries. We primarily use the publicly
available arXiv and PubChem datasets, but the tools can be used for other
datasets as well. Moreover, as new models are developed, they can be easily
integrated into the library. ChemNLP is available at the websites:
https://github.com/usnistgov/chemnlp and https://jarvis.nist.gov/jarvischemnlp
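A minimal sketch of the text-classification task described above (this is not the ChemNLP API; the labels, training snippets, and nearest-centroid approach are illustrative assumptions):

```python
from collections import Counter
import math

def vectorize(text):
    """Bag-of-words term counts for a text."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical topic centroids built from tiny example snippets.
train = {
    "superconductivity": "superconductor critical temperature cooper pairs",
    "catalysis": "catalyst surface reaction adsorption energy",
}
centroids = {label: vectorize(doc) for label, doc in train.items()}

def classify(text):
    """Assign the topic whose centroid is most similar to the text."""
    v = vectorize(text)
    return max(centroids, key=lambda label: cosine(v, centroids[label]))

print(classify("high critical temperature superconductor discovery"))
# -> superconductivity
```

Libraries like ChemNLP replace the toy vectorizer and centroids here with trained transformer or graph-neural-network models, but the pipeline shape (vectorize, score, assign label) is the same.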
Inverse design of next-generation superconductors using data-driven deep generative models
Over the past few decades, finding new superconductors with a high critical
temperature (Tc) has been a challenging task due to computational and
experimental costs. In this work, we present a diffusion model inspired by the
computer vision community to generate new superconductors with unique
structures and chemical compositions. Specifically, we used a crystal diffusion
variational autoencoder (CDVAE) along with atomistic line graph neural network
(ALIGNN) pretrained models and the Joint Automated Repository for Various
Integrated Simulations (JARVIS) superconducting database of density functional
theory (DFT) calculations to generate new superconductors with a high success
rate. We started with a DFT dataset of 1000 superconducting materials
to train the diffusion model. We used the model to generate 3000 new
structures, which, combined with pre-trained ALIGNN screening, yielded 62
candidates. For the top candidate structures, we carried out further DFT
calculations to validate our findings. Such approaches go beyond typical
funnel-like materials-design workflows and allow for the inverse design of
next-generation materials.
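The generate-then-screen funnel described above can be sketched as follows (the generator and Tc predictor here are random stand-ins, not CDVAE or ALIGNN; only the counts mirror the abstract):

```python
import random

random.seed(0)

def generate_candidates(n):
    """Stand-in for a CDVAE-style generative model producing
    candidate crystal structures."""
    return [f"structure_{i}" for i in range(n)]

def predict_tc(structure):
    """Stand-in for a fast pretrained ALIGNN-style Tc surrogate (in K)."""
    return random.uniform(0.0, 40.0)

# Funnel: generate many structures, score each with the cheap surrogate,
# and keep only the top candidates for expensive DFT validation.
candidates = generate_candidates(3000)
scored = sorted(candidates, key=predict_tc, reverse=True)
shortlist = scored[:62]  # counts chosen to mirror the abstract

print(len(shortlist))  # 62 candidates forwarded to DFT validation
```

The inverse-design gain comes from the ordering of steps: the generative model proposes structures directly, and DFT is reserved for the final validation of a small shortlist rather than for exhaustively screening a fixed database.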